
    On stochasticity in nearly-elastic systems

    Nearly-elastic model systems with one or two degrees of freedom are considered: the system loses a small amount of energy in each collision with the "wall". We show that instabilities in this purely deterministic system lead to stochasticity of its long-time behavior. Various ways to give a rigorous meaning to the last statement are considered; all of them, where applicable, lead to the same stochasticity, which is described explicitly. Thus the stochasticity of the long-time behavior is an intrinsic property of the deterministic system. Comment: 35 pages, 12 figures; already online at Stochastics and Dynamics.
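    The energy-loss rule in the abstract can be sketched as a toy one-degree-of-freedom bouncer; the loss fraction and the square-root speed update are illustrative assumptions, not the paper's model.

```python
def bounce_times(v0=1.0, loss=0.01, n_collisions=20):
    """Toy 1-D nearly-elastic bouncer: a particle bounces off a wall, and
    each collision removes a fraction `loss` of its kinetic energy, i.e.
    multiplies its speed by sqrt(1 - loss).  Returns the speed after each
    collision; the long-time behavior studied in the paper concerns the
    accumulated effect of many such small losses."""
    v = v0
    speeds = [v]
    for _ in range(n_collisions):
        v *= (1.0 - loss) ** 0.5   # kinetic energy ~ v^2 shrinks by `loss`
        speeds.append(v)
    return speeds
```

    After n collisions the speed is v0 * (1 - loss)^(n/2), so the energy decays geometrically while remaining nearly elastic per collision.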

    Transfer Entropy as a Log-likelihood Ratio

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic chi-squared distribution is established for the transfer entropy estimator. The result generalises the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

    Optimistic Agents are Asymptotically Optimal

    We use optimism to introduce generic asymptotically optimal reinforcement learning agents. They achieve, with an arbitrary finite or compact class of environments, asymptotically optimal behavior. Furthermore, in the finite deterministic case we provide finite error bounds. Comment: 13 LaTeX pages.
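    The optimism principle the abstract invokes (act greedily with respect to an optimistic value estimate) is most simply illustrated by the classical UCB1 bandit rule below; this is a standard simpler setting than the paper's general environment classes, and the function names are assumptions.

```python
import math
import random

def ucb1(pulls, n_rounds, seed=0):
    """UCB1: a minimal illustration of optimism in the face of uncertainty.
    Each round, play the arm maximizing (empirical mean + confidence bonus);
    `pulls` is a list of reward samplers, one per arm.  Returns pull counts."""
    rng = random.Random(seed)
    k = len(pulls)
    counts = [0] * k
    sums = [0.0] * k
    for t in range(1, n_rounds + 1):
        if t <= k:
            arm = t - 1          # initialize: play each arm once
        else:
            arm = max(range(k), key=lambda a: sums[a] / counts[a]
                      + math.sqrt(2 * math.log(t) / counts[a]))
        sums[arm] += pulls[arm](rng)
        counts[arm] += 1
    return counts
```

    The confidence bonus shrinks as an arm is sampled, so overly optimistic estimates are self-correcting and play concentrates on the best arm.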

    Entropy and Hausdorff Dimension in Random Growing Trees

    We investigate the limiting behavior of random tree growth in preferential attachment models. The tree stems from a root, and we add vertices to the system one by one at random, according to a rule which depends on the degree distribution of the already existing tree. The so-called weight function, in terms of which the rule of attachment is formulated, is such that each vertex in the tree can have at most K children. We define a certain random measure mu on the leaves of the limiting tree, which captures a global property of the tree growth in a natural way. We prove that the Hausdorff and the packing dimension of this limiting measure are equal and constant with probability one. Moreover, the local dimension of mu equals the Hausdorff dimension at mu-almost every point. We give an explicit formula for the dimension, given the rule of attachment.
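    The growth rule described above can be sketched directly: attach each new vertex to an existing vertex with probability proportional to a weight depending on its number of children, with weight zero once a vertex has K children. The linear weight used in the demo is an illustrative assumption, not the paper's specific choice.

```python
import random

def grow_tree(n, weight, K, seed=0):
    """Grow a random recursive tree of n vertices: starting from root 0,
    each new vertex attaches to an existing vertex v with probability
    proportional to weight(#children(v)), and a vertex with K children
    gets weight 0, enforcing the at-most-K-children rule."""
    rng = random.Random(seed)
    children = {0: []}                     # vertex -> list of its children
    for new in range(1, n):
        verts = list(children)
        w = [weight(len(children[v])) if len(children[v]) < K else 0.0
             for v in verts]
        parent = rng.choices(verts, weights=w)[0]
        children[parent].append(new)
        children[new] = []
    return children
```

    With weight(d) = d + 1 this is linear preferential attachment truncated at K children; the leaves of the growing tree are where the limiting measure mu in the abstract lives.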